End User Tools for Evaluating Scholarly Content

Author

  • Carol Anne Meyer
Abstract

The existence of multiple versions of scholarly content (from author websites, institutional repositories, government archives, subject-specific digital libraries, aggregator collections, and publisher websites) makes it difficult for users to locate the most recent version of a document or to ascertain whether the document has been updated or even retracted. This session describes tools that end users can apply to the content they come across to make sure they are citing the most authoritative version available. The reader will learn about the CrossMark version-of-record service and the importance of educating users about how to locate current information.

What Happens When Scholarly Content Changes?

Marc Hauser was a well-known primate researcher at Harvard University. He was on the faculty in the Psychology Department, ran the Cognitive Evolution Laboratory, and was a popular teacher and a leader in his field. Not unusually, the lab had its own web site, where publications of affiliated faculty and researchers were listed and copies were hosted. One of the papers on the site, a 2002 article co-authored by Hauser, became the focus of a year-long Harvard University ethics investigation. The results of the investigation were unfortunate for Hauser. The paper, originally published in the Elsevier journal Cognition, was retracted, the case was made public in the Boston Globe, he was suspended, and he ultimately left Harvard.

Published scholarly research is supposed to be self-correcting. But what happens when articles that have been corrected, updated, or even retracted, as in the case of this paper, are still available in their original form? I first came across the case of Marc Hauser in December 2010. At that time, the retracted paper was still available on the Cognitive Evolution Laboratory's web site as a PDF. The article was formatted for publication in Cognition and was clearly labeled "Article in Press." There was no mention of a retraction or any investigation. In preparation for an article I was writing and several subsequent presentations, I checked the lab's site in January, March, June, and November of 2011. It wasn't until November that the PDF was removed from the lab's site; in fact, the entire site had disappeared by that time. At no time was there any indication that the paper had been retracted.

If you follow the CrossRef DOI link to the paper, you find that the publisher has clearly marked it "Retracted" by adding the word to the title. So the researcher who uses the publisher version will be able to learn about the retraction fairly easily.
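For readers who would rather check a record programmatically than follow the DOI link in a browser, the sketch below queries the public CrossRef REST API for a work's metadata and looks for the same signals described above: a "Retracted" label added to the title, or update information deposited with the record. This is a minimal illustrative sketch, not an official CrossMark client; the DOI in the usage comment is a placeholder, and whether these fields are populated depends on what the publisher has deposited.

    # Minimal sketch, assuming the third-party "requests" library is installed.
    # It fetches a work's metadata from the public CrossRef REST API and reports
    # signals that the item has been retracted or updated.
    import requests

    def check_doi_status(doi: str) -> None:
        resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
        resp.raise_for_status()
        work = resp.json()["message"]

        titles = work.get("title", [])
        print("Title:", titles[0] if titles else "(no title deposited)")

        # Some publishers, as Cognition did here, flag a retraction by
        # adding the word to the article title.
        if any("RETRACT" in t.upper() for t in titles):
            print("The deposited title indicates a retraction.")

        # If this record is itself a CrossMark update notice (for example,
        # a retraction notice), "update-to" points at the DOI it updates.
        for update in work.get("update-to", []):
            print("This record updates:", update.get("DOI"),
                  "| type:", update.get("type"))

    # Usage with a placeholder DOI (hypothetical, not the Hauser paper's DOI):
    # check_doi_status("10.1016/placeholder-doi")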
If, on the other hand, a researcher starts at Google, and a study of librarians, faculty, and students reported in another session at this conference reminds us that this is exactly what academics do, and types in a search for the article "Rule Learning by Cotton-top Tamarins," the first result is the PDF of this article on another site (Citeseer), which does not indicate that the article has been retracted. The second result points to PubMed, which does have a way to notify researchers of retractions, but not of other types of corrections. Of course Google results vary. When I retried this search while preparing this article for the proceedings version of this talk, the first result was in a group from Google Scholar and had a link to the properly identified retracted article at Cognition, but one of the other 13 results for the same citation was a still-available PDF of the paper on the co-author's departmental web site at New York University.

In today's world, scholars have a broad array of sources for the research they rely on: author home pages, institutional repositories, web aggregators, government repositories, multiple ebook formats and readers, and federated search engines. This array is convenient, but it also presents challenges for notifying readers when something has happened to a document after peer review and publication.

The Concerns of Librarians

About a year ago, CrossRef conducted several focus groups with librarians. We wanted to know whether they had concerns about different versions of articles…


Similar articles

A Prototyping and Evaluation of Hospital Dashboard through End-User Computing Satisfaction Model (EUCS)

In today's competitive environment, one of the new tools in the field of information technology is business or organizational dashboards that serve as a backup in the process of strategic management of organizations. The aim of this study is building a prototype of a hospital dashboard on the principles and guidelines of dashboards and evaluating it based on End User Computing Satisfaction (EUCS). The...


VIZ-VIVO: Towards Visualizations-driven Linked Data Navigation

Scholars@Cornell is a new project of Cornell University Library (CUL) that provides linked data and novel visualizations of the scholarly record. Our goal is to enable easy discovery of explicit and latent patterns that can reveal high-impact research areas, the dynamics of scholarly collaboration, and expertise of faculty and researchers. We describe VIZ-VIVO, an extension for the VIVO framewo...


Automatic Construction of Evaluation Sets and Evaluation of Document Similarity Models in Large Scholarly Retrieval Systems

Retrieval systems for scholarly literature offer the ability for the scientific community to search, explore and download scholarly articles across various scientific disciplines. Mostly used by the experts in the particular field, these systems contain user community logs including information on user specific downloaded articles. In this paper we present a novel approach for automatically eva...


Babel: A Platform for Facilitating Research in Scholarly Article Discovery

The body of scientific literature is growing at an exponential rate. This expansion of scientific knowledge has increased the need for tools to help users find relevant articles. However, researchers developing new scholarly article recommendation algorithms face two substantial hurdles: acquiring high-quality, large-scale scholarly metadata and mechanisms for evaluating their recommendation al...


On the Question of How Web 2.0 Features Support Critical Map Reading

Web 2.0 technologies enable users to produce and distribute their own content. The variety of motives for taking part in these communication processes leads to considerable differences in levels of quality. While social media contexts have developed features for evaluating contributions, user-generated maps frequently do not offer tools to question or examine the origin and elements of user-gen...


Case study: Redesigning a Kansei Engineering Designed Scissors by User Centered Design Approach

This paper is based on the research which was conducted earlier on Kansei Engineering (KE) and resulted in a new concept for scissors to redesign it with another method called “User Centered Design” (UCD). This is a shift from translation of the consumers’ psychological feeling about a product related to their perception of the design (KE) to focus on designing for and involving users in the de...




Publication date: 2017